
    The Algorithmic Complexity of Modular Decomposition

    Modular decomposition is a thoroughly investigated topic in many areas such as switching theory, reliability theory, game theory and graph theory. We propose an O(mn)-algorithm for the recognition of a modular set of a monotone Boolean function f with m prime implicants and n variables. Using this result we show that the computation of the modular closure of a set can be done in time O(mn²). On the other hand, we prove that the recognition problem for general Boolean functions is NP-complete. Moreover, we introduce the so-called generalized Shannon decomposition of a Boolean function as an efficient tool for proving theorems on Boolean function decompositions.
    Keywords: computational complexity; Boolean functions; decomposition algorithm; modular decomposition; substitution decomposition
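    The notion of a modular set used above can be checked directly from its definition: a set A of variables is modular for f if f(x) can be written as F(g(x_A), x_B), with B the complement of A. The sketch below is a brute-force truth-table check of that definition (exponential in n, purely illustrative; it is not the O(mn) algorithm of the paper, and the example function f is a made-up illustration).

```python
from itertools import product

def is_modular(f, n, A):
    """Brute-force check whether the variable set A is modular for an
    n-variable Boolean function f, i.e. whether f(x) = F(g(x_A), x_B)
    for some g and F, with B the complement of A.  A is modular exactly
    when every subfunction of f on A (obtained by fixing the variables
    in B) is constant, g, or the negation of g, for one fixed g.

    f : callable taking a tuple of n bits, returning 0 or 1.
    A : list of 0-based variable indices.
    """
    B = [i for i in range(n) if i not in A]
    subfunctions = set()
    for b in product((0, 1), repeat=len(B)):
        # Truth table of f with the B-variables fixed to b, viewed as a
        # function of the variables in A only.
        table = []
        for a in product((0, 1), repeat=len(A)):
            x = [0] * n
            for i, v in zip(A, a):
                x[i] = v
            for i, v in zip(B, b):
                x[i] = v
            table.append(f(tuple(x)))
        t = tuple(table)
        if len(set(t)) > 1:                    # skip constant subfunctions
            comp = tuple(1 - v for v in t)
            subfunctions.add(min(t, comp))     # identify g with its negation
    # Modular iff all non-constant subfunctions agree up to negation.
    return len(subfunctions) <= 1

# Example: for f(x) = (x0 AND x1) OR x2, the set {0, 1} is modular
# (take g = x0 AND x1), while {0, 2} is not.
f = lambda x: (x[0] & x[1]) | x[2]
print(is_modular(f, 3, [0, 1]))   # True
print(is_modular(f, 3, [0, 2]))   # False
```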

    Version Spaces and Generalized Monotone Boolean Functions

    We consider generalized monotone functions f: X --> {0,1} defined for an arbitrary binary relation.
    Keywords: operations research; ordinal classification; artificial intelligence; machine learning; partially defined Boolean functions

    Induction of Ordinal Decision Trees

    This paper focuses on the problem of monotone decision trees from the point of view of the multicriteria decision aid methodology (MCDA). By taking into account the preferences of the decision maker, an attempt is made to bring closer similar research within machine learning and MCDA. The paper addresses the question of how to label the leaves of a tree in a way that guarantees the monotonicity of the resulting tree. Two approaches are proposed for that purpose - dynamic and static labeling - which are also compared experimentally. The paper further considers the problem of splitting criteria in the context of monotone decision trees. Two criteria from the literature are compared experimentally - the entropy criterion and the number of con criterion - in an attempt to find out which one better fits the specifics of monotone problems and which one better handles monotonicity noise.
    Keywords: monotone decision trees; noise; multicriteria decision aid; multicriteria sorting; ordinal classification
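    The monotonicity constraint that the leaf labeling must respect can be stated concretely: whenever one object dominates another on every attribute, its ordinal label must be at least as high. A minimal sketch of a pairwise check of this constraint (an illustration of the constraint itself, not either of the paper's two labeling methods; the toy data is invented):

```python
def dominates(x, y):
    """True if x is componentwise >= y."""
    return all(a >= b for a, b in zip(x, y))

def is_monotone_labeling(examples):
    """Check the monotonicity constraint central to monotone decision
    trees: if xi dominates xj, then label(xi) >= label(xj).
    `examples` is a list of (attribute_tuple, ordinal_label) pairs.
    Quadratic pairwise check, purely illustrative.
    """
    for xi, li in examples:
        for xj, lj in examples:
            if dominates(xi, xj) and li < lj:
                return False
    return True

data = [((1, 2), 0), ((2, 2), 1), ((3, 3), 1)]
print(is_monotone_labeling(data))                    # True
# (3, 4) dominates (2, 2) but gets a lower label -> violation:
print(is_monotone_labeling(data + [((3, 4), 0)]))    # False
```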

    Dualisation, decision lists and identification of monotone discrete functions

    Many data-analysis algorithms in machine learning, data mining and a variety of other disciplines essentially operate on discrete multi-attribute data sets. By means of discretisation or binarisation, numerical data sets can also be successfully analysed. Therefore, in this paper we introduce the theory of (partially defined) discrete functions as an important theoretical tool for the analysis of multi-attribute data sets. In particular we study monotone (partially defined) discrete functions. Compared with the theory of Boolean functions, relatively little is known about (partially defined) monotone discrete functions. It appears that decision lists are useful for the representation of monotone discrete functions. Since dualisation is an important tool in the theory of (monotone) Boolean functions, we study the interpretation and properties of the dual of a (monotone) binary or discrete function. We also introduce the dual of a pseudo-Boolean function. The results are used to investigate extensions of partially defined monotone discrete functions and the identification of monotone discrete functions. In particular we present a polynomial-time algorithm for the identification of so-called stable discrete functions.
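    In the Boolean case, the dual mentioned above is defined pointwise by f^d(x) = NOT f(NOT x); for monotone functions it corresponds to swapping AND and OR. A minimal truth-table sketch of this definition (exponential in the number of variables, so illustrative only; the example function is invented):

```python
from itertools import product

def dual(f, n):
    """Return the dual f^d of an n-variable Boolean function f,
    defined pointwise by f^d(x) = 1 - f(1 - x).  Evaluated lazily,
    so usable only for small n -- an illustrative sketch.
    """
    def fd(x):
        return 1 - f(tuple(1 - v for v in x))
    return fd

# For the monotone function f = x0 AND (x1 OR x2), duality swaps
# AND and OR, giving f^d = x0 OR (x1 AND x2).
f = lambda x: x[0] & (x[1] | x[2])
fd = dual(f, 3)
g = lambda x: x[0] | (x[1] & x[2])
print(all(fd(x) == g(x) for x in product((0, 1), repeat=3)))  # True
```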

    On n-isoclinic Groups


    Modular Decomposition of Boolean Functions

    Modular decomposition is a thoroughly investigated topic in many areas such as switching theory, reliability theory, game theory and graph theory. Most applications can be formulated in the framework of Boolean functions. In this paper we give a unified treatment of modular decomposition of Boolean functions based on the idea of generalized Shannon decomposition. Furthermore, we discuss some new results on the complexity of modular decomposition. We propose an O(mn)-algorithm for the recognition of a modular set of a monotone Boolean function f with m prime implicants and n variables. Using this result we show that the computation of the modular closure of a set can be done in time O(mn²). On the other hand, we prove that the recognition problem for general Boolean functions is coNP-complete.

    Instance-Based penalization techniques for classification

    Several instance-based large-margin classifiers have recently been put forward in the literature: Support Hyperplanes, Nearest Convex Hull classifier, and Soft Nearest Neighbor. We examine those techniques from a common fit-versus-complexity framework and study the links between them. Finally, we compare the performance of these techniques vis-a-vis each other and other standard classification methods.

    Solving and interpreting binary classification problems in marketing with SVMs

    Marketing problems often involve binary classification of customers into ``buyers'' versus ``non-buyers'' or ``prefers brand A'' versus ``prefers brand B''. These cases require binary classification models such as logistic regression, linear, and quadratic discriminant analysis. A promising recent technique for the binary classification problem is the Support Vector Machine (Vapnik (1995)), which has achieved outstanding results in areas ranging from Bioinformatics to Finance. In this paper, we compare the performance of the Support Vector Machine against standard binary classification techniques on a marketing data set and elaborate on the interpretation of the obtained results.
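    The kind of experiment described above can be sketched with scikit-learn's SVC (an assumption: the paper does not specify an implementation, and the toy customer data below is invented):

```python
# Toy "buyers" vs "non-buyers" task with two customer attributes
# (e.g. recency and frequency); label 1 = buyer, 0 = non-buyer.
from sklearn.svm import SVC

X = [[1, 1], [1, 2], [2, 1],      # non-buyers
     [5, 5], [5, 6], [6, 5]]      # buyers
y = [0, 0, 0, 1, 1, 1]

# Linear-kernel SVM, as a stand-in for the paper's setup.
clf = SVC(kernel="linear", C=1.0)
clf.fit(X, y)
print(clf.predict([[1.5, 1.5], [5.5, 5.5]]))  # [0 1]
```

    With a linear kernel, the fitted coefficients (`clf.coef_`, `clf.intercept_`) can also be inspected to interpret which attributes drive the buy/no-buy decision, in the spirit of the interpretation question the paper raises.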

    Classification with support hyperplanes

    A new classification method is proposed, called Support Hyperplanes (SHs). To solve the binary classification task, SHs consider the set of all hyperplanes that do not make classification mistakes, referred to as semi-consistent hyperplanes. A test object is classified using the semi-consistent hyperplane that is farthest away from it. In this way, a good balance between goodness-of-fit and model complexity is achieved, where model complexity is proxied by the distance between a test object and a semi-consistent hyperplane. This idea of complexity resembles the one imputed in the width of the so-called margin between two classes, which arises in the context of Support Vector Machine learning. Class overlap can be handled via the introduction of kernels and/or slack variables. The performance of SHs against standard classifiers is promising on several widely-used empirical data sets.
    Keywords: kernel methods; large margin and instance-based classifiers
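    The SH rule above can be sketched by brute force over a finite candidate pool of hyperplanes. Here the pool is simply the perpendicular bisectors of all between-class point pairs, which is a simplification for illustration: the paper formulates SH as an optimisation over all semi-consistent hyperplanes, and the toy data is invented.

```python
import math

def sh_classify(X, y, t):
    """Support-Hyperplanes-style classification of test point t.
    X: training points, y: labels in {-1, +1}.  A candidate (w, b) is
    kept if it is semi-consistent, i.e. makes no training mistake
    (y_i * (w.x_i + b) >= 0 for all i); t is then classified by the
    side it lies on of the kept hyperplane farthest away from it.
    """
    def dot(u, v):
        return sum(a * b for a, b in zip(u, v))

    # Candidate pool: perpendicular bisectors of between-class pairs.
    candidates = []
    for xi, yi in zip(X, y):
        for xj, yj in zip(X, y):
            if yi == 1 and yj == -1:
                w = [a - b for a, b in zip(xi, xj)]    # normal vector
                mid = [(a + b) / 2 for a, b in zip(xi, xj)]
                candidates.append((w, -dot(w, mid)))

    best_sign, best_dist = 0, -1.0
    for w, b in candidates:
        if all(yi * (dot(w, xi) + b) >= 0 for xi, yi in zip(X, y)):
            dist = abs(dot(w, t) + b) / math.sqrt(dot(w, w))
            if dist > best_dist:
                best_dist = dist
                best_sign = 1 if dot(w, t) + b > 0 else -1
    return best_sign

X = [(0, 0), (1, 0), (0, 1), (5, 5), (6, 5), (5, 6)]
y = [-1, -1, -1, 1, 1, 1]
print(sh_classify(X, y, (1, 1)))   # -1
print(sh_classify(X, y, (5, 4)))   # 1
```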

    Nearest convex hull classification

    Consider the classification task of assigning a test object to one of two or more possible groups, or classes. An intuitive way to proceed is to assign the object to that class to which the distance is minimal. As a distance measure to a class, we propose here to use the distance to the convex hull of that class. Hence the name Nearest Convex Hull (NCH) classification for the method. Convex-hull overlap is handled through the introduction of slack variables and kernels. In spirit and computationally the method is therefore close to the popular Support Vector Machine (SVM) classifier. Advantages of the NCH classifier are its robustness to outliers, good regularization properties and relatively easy handling of multi-class problems. We compare the performance of NCH against state-of-the-art techniques and report promising results.
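    The distance to a convex hull used above is the solution of a small quadratic program: minimise the distance from the test point to a convex combination of the class's points. A minimal NumPy/SciPy sketch of the hard, linear case (the paper's full method adds kernels and slack variables; the class names and toy data are invented):

```python
import numpy as np
from scipy.optimize import minimize

def dist_to_hull(points, t):
    """Euclidean distance from t to the convex hull of `points`:
    minimise ||sum_i lam_i p_i - t||^2 over the probability simplex."""
    P = np.asarray(points, dtype=float)
    k = len(P)
    res = minimize(
        lambda lam: np.sum((lam @ P - t) ** 2),
        x0=np.full(k, 1.0 / k),                 # start at the centroid
        bounds=[(0.0, 1.0)] * k,                # lam_i >= 0
        constraints=[{"type": "eq",
                      "fun": lambda lam: lam.sum() - 1.0}],  # sum to 1
        method="SLSQP",
    )
    return np.sqrt(res.fun)

def nch_classify(classes, t):
    """Assign t to the class whose convex hull is nearest."""
    return min(classes, key=lambda c: dist_to_hull(classes[c], t))

classes = {
    "A": [(0, 0), (1, 0), (0, 1)],
    "B": [(5, 5), (6, 5), (5, 6)],
}
print(nch_classify(classes, np.array([0.2, 0.2])))   # A (inside hull of A)
print(nch_classify(classes, np.array([5.5, 5.2])))   # B
```

    Because the distance is zero for any point inside a hull and grows smoothly outside it, a single outlier only enlarges its own class's hull locally, which is one way to see the robustness claim made above.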